coupling flow
Re: Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators (ID=1064)
We thank the reviewers for their careful reading; we will update the paper based on the suggestions. On what occasion would the diffeomorphic universality results be useful other than distribution approximation? Thank you for pointing out the missing references.
- North America > United States > New Jersey (0.04)
- Asia > Japan (0.04)
- Europe > Germany > Bavaria > Upper Bavaria > Munich (0.04)
- North America > United States > New York (0.04)
- Health & Medicine > Pharmaceuticals & Biotechnology (0.93)
- Materials > Chemicals (0.93)
- Health & Medicine > Therapeutic Area > Immunology (0.68)
Fast and Unified Path Gradient Estimators for Normalizing Flows
Vaitl, Lorenz; Winkler, Ludwig; Richter, Lorenz; Kessel, Pan
Recent work shows that path gradient estimators for normalizing flows have lower variance than standard estimators for variational inference, resulting in improved training. However, they are often computationally prohibitive and cannot be applied to maximum likelihood training in a scalable manner, which severely hinders their widespread adoption. In this work, we overcome these crucial limitations. Specifically, we propose a fast path gradient estimator that significantly improves computational efficiency and works for all normalizing flow architectures of practical relevance. We then show that this estimator can also be applied to maximum likelihood training, where it has a regularizing effect because it takes the form of a given target energy function into account. We empirically establish its superior performance and reduced variance on several natural-science applications.
- North America > Canada (0.28)
- North America > United States > Maryland (0.14)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.93)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models (0.68)
- Information Technology > Artificial Intelligence > Machine Learning > Statistical Learning (0.67)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.54)
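To make the idea concrete, below is a minimal sketch of a path gradient for the reverse KL in the "sticking-the-landing" style that such estimators build on, using a toy one-dimensional affine flow. This is a hedged illustration, not the paper's fast estimator: the flow, the Gaussian target `log_p`, and all function names are assumptions introduced here. The key move is stopping the gradient through the density's parameters so the derivative flows only along the sampling path:

```python
import jax
import jax.numpy as jnp

# Toy affine "flow": x = mu + exp(log_sigma) * z (illustrative, not the paper's model).
def flow(params, z):
    mu, log_sigma = params
    return mu + jnp.exp(log_sigma) * z

# Log-density of the flow's output distribution q(x) = N(mu, sigma^2),
# computed via the inverse map and the change-of-variables formula.
def log_q(params, x):
    mu, log_sigma = params
    z = (x - mu) * jnp.exp(-log_sigma)
    return -0.5 * z**2 - 0.5 * jnp.log(2.0 * jnp.pi) - log_sigma

# Assumed unnormalized target energy (hypothetical example).
def log_p(x):
    return -0.5 * ((x - 2.0) / 0.5) ** 2

def path_kl(params, key, n=1024):
    z = jax.random.normal(key, (n,))
    x = flow(params, z)                     # gradient flows through the path x(params)
    frozen = jax.lax.stop_gradient(params)  # drop the score term of the gradient
    return jnp.mean(log_q(frozen, x) - log_p(x))

params = (jnp.array(0.0), jnp.array(0.0))
grads = jax.grad(path_kl)(params, jax.random.PRNGKey(0))
```

Because the dropped score term has zero expectation, the estimator remains unbiased while typically having lower variance; the paper's contribution is making this family of estimators fast and applicable to maximum likelihood training as well.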
Bridging Mean-Field Games and Normalizing Flows with Trajectory Regularization
Huang, Han; Yu, Jiajia; Chen, Jie; Lai, Rongjie
Mean-field games (MFGs) are a modeling framework for systems with a large number of interacting agents, with applications in economics, finance, and game theory. Normalizing flows (NFs) are a family of deep generative models that compute data likelihoods through an invertible mapping, typically parameterized by neural networks; they are useful for density modeling and data generation. While both models are actively researched, few works have noted the relationship between the two. In this work, we unravel the connections between MFGs and NFs by recasting the training of an NF as solving an MFG. This is achieved by reformulating the MFG problem in terms of agent trajectories and parameterizing a discretization of the resulting MFG with flow architectures. With this connection, we explore two research directions. First, we employ expressive NF architectures to accurately solve high-dimensional MFGs, sidestepping the curse of dimensionality that afflicts traditional numerical methods. Compared with other deep learning approaches, our trajectory-based formulation encodes the continuity equation in the neural network, resulting in a better approximation of the population dynamics. Second, we regularize the training of NFs with transport costs and show their effectiveness in controlling the model's Lipschitz bound, which yields better generalization performance. We demonstrate numerical results through comprehensive experiments on a variety of synthetic and real-life datasets.
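As a hedged illustration of the second direction, the sketch below adds a transport-cost penalty to maximum-likelihood training of a toy flow: each data point traces a discrete "trajectory" through the layers, and the squared displacement of every step, a discrete analogue of kinetic energy, is penalized. The affine layer, the weight `lam`, and the per-step L2 cost are assumptions made for illustration, not the paper's exact construction:

```python
import jax
import jax.numpy as jnp

# One invertible affine layer acting in the data-to-base direction;
# returns the transformed point and its log|det Jacobian| contribution.
def layer(p, x):
    mu, log_scale = p
    return mu + jnp.exp(log_scale) * x, log_scale

def loss(params, x_data, lam=0.1):
    x, logdet, cost = x_data, 0.0, 0.0
    for p in params:
        x_next, ld = layer(p, x)
        cost += jnp.mean((x_next - x) ** 2)  # transport cost of this step
        logdet += ld
        x = x_next
    # Change of variables: log q(x_data) = log N(z; 0, 1) + sum of log-dets.
    base_ll = -0.5 * x**2 - 0.5 * jnp.log(2.0 * jnp.pi)
    nll = -jnp.mean(base_ll + logdet)        # maximum-likelihood term
    return nll + lam * cost                  # trajectory-regularized objective

params = [(jnp.array(0.0), jnp.array(0.0)) for _ in range(4)]
x_data = 2.0 * jax.random.normal(jax.random.PRNGKey(0), (256,)) + 1.0
grads = jax.grad(loss)(params, x_data)
```

One intuition for the Lipschitz claim: penalizing large per-layer displacements discourages the composed map from moving points far, which constrains how steep the learned transformation can become.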
Universality of parametric Coupling Flows over parametric diffeomorphisms
Lyu, Junlong; Chen, Zhitang; Feng, Chang; Cun, Wenjing; Zhu, Shengyu; Geng, Yanhui; Xu, Zhijie; Chen, Yongwei
Invertible neural networks (INNs) such as coupling flows were first introduced as a class of generative models with a tractable likelihood [11, 25, 40], and in recent years have proven useful and powerful in various machine learning tasks such as inverse problems [2], probabilistic inference [29], and feature extraction [22]. Given the many successful applications of INNs, it is natural to ask whether this type of model is universally expressive. As most generative models are mainly concerned with transformations between distributions, existing works such as [19, 23] focused on expressiveness from the distribution perspective. However, expressiveness from the distribution perspective does not imply expressiveness from the mapping perspective, since a large (or even infinite) number of diffeomorphisms map a given source µ to a given target ν; for instance, when µ = ν is the standard Gaussian on the real line, both the identity map and x ↦ −x push µ onto ν. In many applications, knowing distributional universality is therefore not enough: one may want to know whether the optimal transport map [41], which finds emerging applications in many fields, e.g., machine learning [32], wireless communication [30], and economics [15], can be approximated by invertible neural networks.
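Since the whole discussion turns on coupling flows, here is a minimal sketch of a single affine coupling layer in the spirit of NICE/RealNVP: half of the coordinates pass through unchanged and condition an affine transform of the other half, so the layer is invertible in closed form. The conditioner `net` is an illustrative stand-in for a learned neural network:

```python
import jax.numpy as jnp

# Stand-in conditioner producing (log-scale, shift) from the untouched half;
# in practice this is an arbitrary learned neural network.
def net(x1):
    return jnp.tanh(x1), 0.5 * x1

def coupling_forward(x):
    x1, x2 = jnp.split(x, 2, axis=-1)
    s, t = net(x1)                    # transform parameters depend only on x1
    return jnp.concatenate([x1, x2 * jnp.exp(s) + t], axis=-1)

def coupling_inverse(y):
    y1, y2 = jnp.split(y, 2, axis=-1)
    s, t = net(y1)                    # y1 == x1, so s and t are recoverable
    return jnp.concatenate([y1, (y2 - t) * jnp.exp(-s)], axis=-1)

x = jnp.array([0.3, -1.2, 0.8, 0.1])
assert jnp.allclose(coupling_inverse(coupling_forward(x)), x)
```

Invertibility never requires inverting `net` itself, which is why arbitrarily expressive conditioners can be used; the universality question is whether stacks of such layers can approximate any (parametric) diffeomorphism, not merely transport one distribution to another.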